Training MLP Network Using Scatter Search as a Global Optimizer
Abstract
Many real-life optimization problems are nonconvex and may have several local minima within their feasible region; global search methods are therefore needed. Metaheuristics are efficient global optimizers that embed a metastrategy to guide an underlying heuristic search. Genetic algorithms, simulated annealing, tabu search, and scatter search are among the best-known metaheuristics. In general, they do not require the optimization problem to be differentiable or the feasible region to be connected, so metaheuristics are applicable to a large number of problems.
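The paper's own scatter search procedure is not reproduced in this excerpt. As a rough illustration of the metaheuristic's basic structure (diversification, a reference set, solution combination, reference-set update), the following is a minimal sketch that minimizes the nonconvex Rastrigin test function; in the paper, the objective would instead be the MLP's training error evaluated over its weight vector. All names and parameter choices here are illustrative, not the authors':

```python
import math
import random

def rastrigin(x):
    # Nonconvex test function with many local minima; global minimum 0 at the origin
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def scatter_search(f, dim, lo=-5.12, hi=5.12, ref_size=10, iters=50, seed=0):
    rng = random.Random(seed)
    # Diversification: generate a pool of random candidate solutions
    pool = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(10 * ref_size)]
    # Reference set: keep the best candidates by objective value
    ref = sorted(pool, key=f)[:ref_size]
    for _ in range(iters):
        # Combination: linear combinations of every reference-set pair,
        # allowing extrapolation beyond the connecting segment
        trials = []
        for i in range(len(ref)):
            for j in range(i + 1, len(ref)):
                w = rng.uniform(-0.5, 1.5)
                trials.append([min(hi, max(lo, a + w * (b - a)))
                               for a, b in zip(ref[i], ref[j])])
        # Reference-set update: retain the best ref_size of old + new solutions
        ref = sorted(ref + trials, key=f)[:ref_size]
    return ref[0]

best = scatter_search(rastrigin, dim=2)
print(rastrigin(best))
```

This sketch omits two components a full scatter search normally includes: a diversity criterion when updating the reference set (not only objective quality) and a local improvement step applied to combined solutions.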
Similar resources
Training an MLP neural network for image compression using the GSA method
Image compression is one of the important research fields in image processing, and many methods have been proposed for it. Neural networks are one such method and have shown good performance in many applications. The usual method for training neural networks is error back-propagation, whose drawbacks are slow convergence and entrapment in points of lo...
Bounding the Search Space for Global Optimization of Neural Networks Learning Error: An Interval Analysis Approach
Training a multilayer perceptron (MLP) with algorithms employing global search strategies has been an important research direction in the field of neural networks. Despite a number of significant results, an important matter concerning the bounds of the search region (typically defined as a box) where a global optimization method has to search for a potential global minimizer seems to be unresol...
G-Prop-III: Global Optimization of Multilayer Perceptrons using an Evolutionary
This paper proposes a new version of a method (G-Prop-III, genetic backpropagation) that attempts to solve the problem of finding appropriate initial weights and learning parameters for a single-hidden-layer Multilayer Perceptron (MLP) by combining a genetic algorithm (GA) and backpropagation (BP). The GA selects the initial weights and the learning rate of the network, and changes the number o...
Novel MLP Neural Network with Hybrid Tabu Search Algorithm
In this paper, we propose a new global and fast Multilayer Perceptron Neural Network (MLP-NN) that can be used to forecast automotive prices. Nowadays, gradient-based techniques such as back-propagation are widely used for training neural networks. These techniques have only local convergence guarantees and can therefore perform poorly even on simple problems when forecasting out of samp...
Multi-layer Perceptron Error Surfaces: Visualization, Structure and Modelling
The Multi-Layer Perceptron (MLP) is one of the most widely applied and researched Artificial Neural Network models. MLP networks are normally applied to supervised learning tasks, which involve iterative training methods to adjust the connection weights within the network. This is commonly formulated as a multivariate non-linear optimization problem over a very high-dimensional space ...